Approaches Based on Markovian Architectural Bias in Recurrent Neural Networks
Authors
Abstract
Recent studies show that the state-space dynamics of a randomly initialized recurrent neural network (RNN) have interesting and potentially useful properties even without training. More precisely, when an RNN is initialized with small weights, the activities of its recurrent units reflect the history of inputs presented to the network according to a Markovian scheme. This property of RNNs is called the Markovian architectural bias. Our work focuses on various techniques that make use of this architectural bias. The first technique is based on substituting the RNN output layer with a prediction model, which makes it possible to exploit the interesting state representation. The second approach, known as echo state networks (ESNs), is based on a large, untrained, randomly interconnected hidden layer that serves as a reservoir of interesting behavior. We have investigated both approaches and their combination and performed simulations to demonstrate their usefulness.
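A minimal sketch in Python/NumPy of the two ideas described in the abstract: a large, untrained, randomly interconnected reservoir with small weights (as in an ESN), whose states are fed to a simple trained prediction model, here a ridge-regression readout. All names, sizes, and parameters (the reservoir size, the spectral-radius rescaling, the ridge constant, and the toy sine-wave task) are illustrative assumptions, not the setup used in the paper.

# Sketch only: untrained random reservoir + trained readout ("prediction model").
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 200                              # assumed dimensions
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))      # small input weights
W = rng.uniform(-1.0, 1.0, (n_res, n_res))

# Rescale recurrent weights to a small spectral radius so reservoir states
# contract toward input-driven (Markovian, suffix-based) representations.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Collect reservoir states for an input sequence; the reservoir itself is never trained."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy task (illustrative only): one-step-ahead prediction of a sine wave.
seq = np.sin(0.2 * np.arange(1000))
X = run_reservoir(seq[:-1])                       # states reflect recent input history
Y = seq[1:].reshape(-1, 1)                        # targets: next value

# Train only the readout by ridge regression; this plays the role of the prediction model.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

pred = X @ W_out
print("train MSE:", float(np.mean((pred - Y) ** 2)))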
Similar articles
Text Correction Using Approaches Based on Markovian Architectural Bias
Several authors have reported interesting results obtained by using the untrained, randomly initialized recurrent part of a recurrent neural network (RNN). Instead of a long, difficult, and often unnecessary adaptation process, the dynamics based on fixed-point attractors can be rich enough to be exploited further for some tasks. The principle explaining the structure of the untrained RNN state space is called Mark...
Comparison of Echo State Networks with Simple Recurrent Networks and Variable-Length Markov Models on Symbolic Sequences
A lot of attention is now being focused on connectionist models known under the name “reservoir computing”. The most prominent example of these approaches is a recurrent neural network architecture called an echo state network (ESN). ESNs have been successfully applied to many real-valued time-series modeling tasks and have performed exceptionally well. Also, using ESNs for processing symbolic sequences s...
Markovian Bias of Neural-based Architectures With Feedback Connections
Dynamic neural network architectures can deal naturally with sequential data through recursive processing enabled by feedback connections. We show how such architectures are predisposed for suffix-based Markovian input sequence representations in both supervised and unsupervised learning scenarios. In particular, in the context of such architectural predispositions, we study computational and l...
Robust stability of fuzzy Markov type Cohen-Grossberg neural networks by delay decomposition approach
In this paper, we investigate the delay-dependent robust stability of fuzzy Cohen-Grossberg neural networks with Markovian jumping parameters and mixed time-varying delays by the delay decomposition method. A new Lyapunov-Krasovskii functional (LKF) is constructed by nonuniformly dividing the discrete delay interval into multiple subintervals and choosing proper functionals with different weighting matr...